Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization



Similar Articles

Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization

By combining least-squares temporal difference (LSTD) algorithms with sparse kernel methods, the feature dictionary can be constructed automatically and better generalization ability can be obtained. However, previous kernel-based LSTD algorithms do not consider regularization, and their sparsification processes are batch or offline, which hinders their widespread application to online learning problems...
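
As a rough illustration of the kind of update such an online, regularized kernel RLS-TD method performs, the sketch below combines an approximate-linear-dependence (ALD) test for online sparsification, L2 regularization through the initialization of the inverse matrix, and a Sherman-Morrison-style RLS-TD(0) recursion. This is not the authors' algorithm: the class name, the Gaussian kernel, the hyper-parameters (gamma, reg, nu, sigma), and the zero-padded growth of P and w when the dictionary expands are all illustrative assumptions.

```python
import numpy as np

class KernelRLSTD:
    """Sketch of online kernel RLS-TD(0) with ALD sparsification and L2 regularization."""

    def __init__(self, gamma=0.95, reg=1.0, nu=0.01, sigma=1.0):
        self.gamma, self.reg, self.nu, self.sigma = gamma, reg, nu, sigma
        self.dict_states = []   # sparse dictionary of representative states
        self.Kinv = None        # inverse kernel matrix of the dictionary (for the ALD test)
        self.w = None           # value-function weights over dictionary features
        self.P = None           # recursive inverse of the regularized LSTD matrix

    def kernel(self, x, y):
        d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
        return np.exp(-d @ d / (2.0 * self.sigma ** 2))

    def features(self, s):
        return np.array([self.kernel(s, d) for d in self.dict_states])

    def _maybe_add(self, s):
        """ALD test: add s to the dictionary only if the dictionary cannot
        already approximate it well in feature space."""
        if not self.dict_states:
            self.dict_states.append(s)
            self.Kinv = np.array([[1.0 / self.kernel(s, s)]])
            self.w = np.zeros(1)
            self.P = np.eye(1) / self.reg          # P0 = (reg * I)^{-1}
            return
        k = self.features(s)
        a = self.Kinv @ k
        delta = self.kernel(s, s) - k @ a          # feature-space reconstruction error
        if delta > self.nu:
            n = len(self.dict_states)
            Kinv = np.zeros((n + 1, n + 1))        # block update of the kernel inverse
            Kinv[:n, :n] = self.Kinv + np.outer(a, a) / delta
            Kinv[:n, n] = Kinv[n, :n] = -a / delta
            Kinv[n, n] = 1.0 / delta
            self.Kinv = Kinv
            self.dict_states.append(s)
            # grow w and P by one dimension; zero / (1/reg) padding is a simplification
            self.w = np.append(self.w, 0.0)
            P = np.zeros((n + 1, n + 1))
            P[:n, :n] = self.P
            P[n, n] = 1.0 / self.reg
            self.P = P

    def update(self, s, r, s_next):
        """Process one transition (s, r, s') with a Sherman-Morrison RLS-TD(0) step."""
        self._maybe_add(s)
        self._maybe_add(s_next)
        phi, phi_next = self.features(s), self.features(s_next)
        d = phi - self.gamma * phi_next            # TD feature difference
        Pphi = self.P @ phi
        denom = 1.0 + d @ Pphi
        self.w = self.w + Pphi * (r - d @ self.w) / denom
        self.P = self.P - np.outer(Pphi, d @ self.P) / denom

    def value(self, s):
        return float(self.features(s) @ self.w)
```

In this sketch a learner would call update(s, r, s_next) once per observed transition and query value(s) at any time; the dictionary, and hence the model size, grows only when the ALD test flags a genuinely novel state.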


Kernel Least-Squares Temporal Difference Learning

Kernel methods have recently attracted much research interest, since by utilizing Mercer kernels, nonlinear and nonparametric versions of conventional supervised or unsupervised learning algorithms can be implemented, and better generalization ability can usually be obtained. However, kernel methods have not been widely studied in the reinforcement learning literature. In this paper, w...
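
For contrast with the recursive variants, the batch form of least-squares TD over kernel features can be sketched in a few lines; the Gaussian kernel, the fixed set of centre states, and the regularization constant below are assumptions for illustration rather than details taken from the cited paper.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.exp(-d @ d / (2.0 * sigma ** 2))

def kernel_lstd(transitions, centres, gamma=0.95, reg=1e-3, sigma=1.0):
    """Batch LSTD over explicit kernel features.
    transitions: iterable of (s, r, s_next); returns weights w such that
    V(s) is approximated by sum_j w[j] * k(s, centres[j])."""
    def phi(s):
        return np.array([gaussian_kernel(s, c, sigma) for c in centres])
    n = len(centres)
    A = reg * np.eye(n)            # Tikhonov term keeps the system well conditioned
    b = np.zeros(n)
    for s, r, s_next in transitions:
        f, f_next = phi(s), phi(s_next)
        A += np.outer(f, f - gamma * f_next)
        b += r * f
    return np.linalg.solve(A, b)
```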


Kernel Recursive Least Squares

We present a non-linear, kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean-squared-error regressor. Sparsity (and therefore regularization) of the solution is achieved by an explicit greedy sparsification proces...
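
The "explicit greedy sparsification process" referred to here is typically an approximate-linear-dependence (ALD) test; the small helper below illustrates that criterion in isolation. The function name, the threshold nu, and the interface are assumed for illustration and are not taken from the cited paper.

```python
import numpy as np

def ald_test(x, dictionary, kernel, Kinv, nu=0.01):
    """Approximate-linear-dependence test.
    dictionary: list of stored samples; Kinv: inverse kernel matrix of the dictionary.
    Returns (is_novel, k_vec): is_novel is True when x cannot be reconstructed,
    within squared error nu, as a feature-space combination of dictionary points."""
    k_vec = np.array([kernel(x, d) for d in dictionary])
    a = Kinv @ k_vec                    # optimal reconstruction coefficients
    delta = kernel(x, x) - k_vec @ a    # squared feature-space reconstruction error
    return bool(delta > nu), k_vec
```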


Hierarchic Kernel Recursive Least-Squares

We present a new hierarchic kernel-based modeling technique for modeling evenly distributed multidimensional datasets that does not rely on input-space sparsification. The presented method reorganizes the typical single-layer kernel-based model into a hierarchical structure, such that the weights of a kernel model over each dimension are modeled over the adjacent dimension. We show that the impos...
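
One possible (purely speculative) reading of this hierarchical structure, for a two-dimensional input (x1, x2), is a model whose dimension-1 kernel weights are themselves produced by a kernel model over dimension 2. The sketch below, including the names, the kernel, the centre grids, and the regularized least-squares fit, is an assumption made only to illustrate that reading, not the method of the cited paper.

```python
import numpy as np

def rbf(a, b, sigma=1.0):
    # scalar Gaussian kernel over a single input dimension
    return float(np.exp(-(a - b) ** 2 / (2.0 * sigma ** 2)))

def hierarchic_kernel_model(c1, c2, C):
    """y(x1, x2) = sum_i w_i(x2) * k(x1, c1[i]), where w_i(x2) = sum_j C[i, j] * k(x2, c2[j])."""
    def predict(x1, x2):
        k1 = np.array([rbf(x1, c) for c in c1])   # kernels over the first dimension
        k2 = np.array([rbf(x2, c) for c in c2])   # kernels over the adjacent dimension
        w = C @ k2                                # dimension-1 weights produced by the dimension-2 model
        return float(k1 @ w)
    return predict

def fit_coefficients(data, c1, c2, reg=1e-3):
    """data: list of ((x1, x2), y). Fits C by regularized least squares; with linear
    weight models the hierarchy is equivalent to a tensor-product feature expansion."""
    X = np.array([[rbf(x1, a) * rbf(x2, b) for a in c1 for b in c2]
                  for (x1, x2), _ in data])
    y = np.array([t for _, t in data])
    n = X.shape[1]
    c_flat = np.linalg.solve(X.T @ X + reg * np.eye(n), X.T @ y)
    return c_flat.reshape(len(c1), len(c2))
```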



Journal

Journal title: Computational Intelligence and Neuroscience

Year: 2016

ISSN: 1687-5265, 1687-5273

DOI: 10.1155/2016/2305854